Hardest One-Dimensional Subproblems

Authors

  • David L. Donoho
  • Richard C. Liu
Abstract

For a long time, lower bounds on the difficulty of estimation have been constructed by showing that estimation was difficult even in certain 1-dimensional subproblems. The logical extension of this is to identify hardest one-dimensional subproblems and to ask whether these are, either exactly or approximately, as difficult as the full problem. We do this in three settings: estimating linear functionals from observations with Gaussian noise, recovering linear functionals from observations with deterministic noise, and making confidence statements for linear functionals from observations with Gaussian noise. We show that the minimax value of the hardest subproblem is, in each case, equal to, or within a few percent of, the minimax value of the full problem. The sharpest known bounds on the asymptotic minimax risk and on the minimax confidence interval size follow from this approach. New connections between statistical estimation and the theory of optimal recovery are also established. For example, 95% confidence intervals based on estimators developed in the theory of optimal recovery are optimal among linear confidence procedures and within 19% of minimax among all procedures.

Abbreviated title: Hardest 1-d subproblems.
AMS-MOS subject classifications: primary 62J05; secondary 62G35, 41A15.
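As a reading aid, here is a schematic version, in our own notation, of what a one-dimensional subproblem looks like in the first setting; this sketch is our paraphrase, not text from the paper. Suppose we observe $y = f + \sigma z$ with $z$ white Gaussian noise and wish to estimate a linear functional $T(f)$ over a convex class $\mathcal{F}$. Restricting $f$ to a segment

$$ f_t = \tfrac{1+t}{2}\, f_1 + \tfrac{1-t}{2}\, f_{-1}, \qquad t \in [-1,1], \quad f_{\pm 1} \in \mathcal{F}, $$

gives a one-dimensional subproblem: the projection of $y$ onto $f_1 - f_{-1}$ is sufficient, and after an affine change of variables the task becomes estimating a bounded normal mean $\theta \in [-\tau, \tau]$ from a single observation $x \sim N(\theta, \sigma^2)$, where $2\tau = |T(f_1) - T(f_{-1})|$ on the $T$ scale and the noise level is rescaled by $\|f_1 - f_{-1}\|$. The hardest one-dimensional subproblem is the pair $f_{\pm 1}$ maximizing the minimax risk of this bounded-mean problem, and the paper's results compare that value with the minimax risk of the full problem.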


Similar articles

On the Estimation of Quadratic Functionals

We discuss the difficulties of estimating quadratic functionals based on observations $Y(t)$ from the white noise model
$$ Y(t) = \int_0^t f(u)\,du + \sigma W(t), \qquad t \in [0,1], $$
where $W(t)$ is a standard Wiener process on $[0,1]$. The optimal rates of convergence (as $\sigma \to 0$) for estimating quadratic functionals under certain geometric constraints are found. Specifically, the optimal rates of estimating $\int [f^{(k)}($...
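A minimal illustration of why such functionals are delicate (our notation and construction, not necessarily the authors'): expanding in an orthonormal basis $(\varphi_k)$, the white noise model yields independent coefficient estimates $\hat y_k = \int_0^1 \varphi_k\, dY \sim N(\theta_k, \sigma^2)$ with $\theta_k = \int_0^1 f \varphi_k$, and since $\int_0^1 f^2 = \sum_k \theta_k^2$, a truncated bias-corrected estimator is

$$ \hat Q_m = \sum_{k \le m} \bigl( \hat y_k^2 - \sigma^2 \bigr), \qquad E\, \hat Q_m = \sum_{k \le m} \theta_k^2 . $$

The rate of convergence is then governed by trading the truncation bias $\sum_{k > m} \theta_k^2$, controlled by the geometric constraints on $f$, against the variance, which grows with $m$.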


Adaptively local 1-dimensional subproblems

We provide new insight into the difficulty of nonparametric estimation of a whole function. A new method is invented for finding a minimax lower bound for globally estimating a function. The idea is to adjust automatically the direction toward the nearly hardest 1-dimensional subproblem at each location, and to use locally the difficulty of that 1-dimensional subproblem. In a variety of contexts, our met...
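Schematically (our paraphrase of the idea, not the paper's statement): if the perturbations live on disjoint intervals $I_j$, and on each interval the class contains a one-dimensional segment whose bounded-normal-mean difficulty is $\rho_j$, then for squared $L_2$ loss the subproblems essentially decouple and

$$ \inf_{\hat f}\ \sup_{f \in \mathcal{F}}\ E \| \hat f - f \|_2^2 \ \gtrsim\ \sum_j \rho_j , $$

so pointing each local segment in (nearly) the hardest direction converts one-dimensional difficulties into a global minimax lower bound.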


Using the Breakout Algorithm to Identify Hard and Unsolvable Subproblems

Local search algorithms have been very successful for solving constraint satisfaction problems (CSP). However, a major weakness has been that local search is unable to detect unsolvability and is thus not suitable for highly constrained or overconstrained problems. In this paper, we present a scheme where a local search algorithm, the breakout algorithm, is used to identify hard or unsolvable s...
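To make the abstract's use of the breakout algorithm concrete, here is a minimal sketch of the classic weight-based breakout scheme, not the paper's subproblem-identification scheme; the CSP interface (variables, domains, and (scope, predicate) constraints) is our own assumption.

    import random

    def breakout(variables, domains, constraints, max_steps=10000, seed=0):
        # variables: list of variable names
        # domains: dict mapping each variable to a list of candidate values
        # constraints: list of (scope, predicate) pairs; `predicate` receives the
        #   current values of the variables in `scope` and returns True if satisfied
        rng = random.Random(seed)
        assign = {v: rng.choice(domains[v]) for v in variables}
        weight = [1] * len(constraints)      # one breakout weight per constraint

        def violated(a):
            return [i for i, (scope, pred) in enumerate(constraints)
                    if not pred(*(a[v] for v in scope))]

        def cost(a):
            return sum(weight[i] for i in violated(a))

        for _ in range(max_steps):
            viol = violated(assign)
            if not viol:
                return assign                # every constraint satisfied
            current = sum(weight[i] for i in viol)
            best = None
            # try every single-variable change and keep the best strict improvement
            for v in variables:
                for d in domains[v]:
                    if d == assign[v]:
                        continue
                    trial = dict(assign)
                    trial[v] = d
                    c = cost(trial)
                    if c < current and (best is None or c < best[2]):
                        best = (v, d, c)
            if best is None:
                # local minimum: "break out" by raising the weights of the
                # currently violated constraints, then continue the search
                for i in viol:
                    weight[i] += 1
            else:
                assign[best[0]] = best[1]
        return None                          # no solution within the step budget

Constraints whose weights keep growing are exactly the ones the search repeatedly fails to satisfy, which is the signal the paper exploits to flag hard or unsolvable subproblems.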


Limitations of Incremental Dynamic Programs

We consider so-called “incremental” dynamic programming algorithms, and are interested in the number of subproblems produced by them. The standard dynamic programming algorithm for the n-dimensional Knapsack problem is incremental, produces nK subproblems and nK relations (wires) between the subproblems, where K is the capacity of the knapsack. We show that any incremental algorithm for this pr...
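For reference, a minimal sketch of the standard dynamic program the abstract refers to, written so that the roughly n*K subproblems are visible as an explicit table; the function name and interface are ours.

    def knapsack_table(weights, values, K):
        # 0/1 Knapsack by the standard dynamic program, with the
        # (n+1) x (K+1) table of subproblems kept explicit:
        # best[i][c] = maximum value using the first i items with capacity c
        n = len(weights)
        best = [[0] * (K + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            w, v = weights[i - 1], values[i - 1]
            for c in range(K + 1):
                best[i][c] = best[i - 1][c]                            # skip item i
                if w <= c:
                    best[i][c] = max(best[i][c], best[i - 1][c - w] + v)  # take item i
        return best[n][K]

    # e.g. knapsack_table([2, 3, 4], [3, 4, 5], K=5) returns 7 (items of weight 2 and 3)

Each table entry depends only on two entries of the previous row, which is the sense in which the algorithm is "incremental" and produces on the order of nK subproblems and relations.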


Theoretical Investigations on the Modified Integer Round-up Property for the One-dimensional Cutting Stock Problem (Technische Universität Dresden)

Many numerical computations show only a small difference between the optimal value of the one-dimensional cutting stock problem and that of its corresponding linear programming relaxation. In this paper we investigate the one-dimensional cutting stock problem with respect to the modified integer round-up property (MIRUP) and present some results on subproblems having the MIRUP.
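For readers outside the cutting stock literature, the property in question is usually written as follows (a standard statement, not quoted from this abstract):

$$ z_{IP} \ \le\ \lceil z_{LP} \rceil + 1, $$

where $z_{IP}$ is the optimal value of the cutting stock instance and $z_{LP}$ that of its linear programming relaxation; an instance or subproblem "has the MIRUP" when this inequality holds.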



Publication year: 2008